Entropy

In information theory, the entropy of a random variable \( X \) measures the average uncertainty in its outcomes: the expected negative log-probability under its distribution \( p \). The base of the logarithm fixes the unit (bits for base 2, nats for base \( e \)).

\[ H(X) \coloneqq - \mathbb{E}[\log{p(X)}] \]

For a discrete random variable with support \( \mathcal{X} \), expanding the expectation gives

\[ H(X) = -\sum_{x\in \mathcal{X}} p(x) \log{p(x)} \]
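
The discrete sum translates directly into code. A minimal sketch (the function name `entropy` and the base-2 default are our choices, not from the text), using the convention \( 0 \log 0 = 0 \):

```python
import math

def entropy(p, base=2):
    """Shannon entropy -sum p(x) log p(x) of a discrete distribution.

    `p` is a sequence of probabilities summing to 1. Zero-probability
    outcomes are skipped, implementing the 0 log 0 := 0 convention.
    """
    return -sum(px * math.log(px, base) for px in p if px > 0)

# A fair coin carries one bit of uncertainty; a certain outcome carries none.
print(entropy([0.5, 0.5]))  # → 1.0
print(entropy([1.0]))       # 0 bits: no uncertainty
```

With base 2 the result is in bits: a uniform distribution over four outcomes gives 2 bits, the maximum for four outcomes.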